Neural tensor contractions and the expressive power of deep neural quantum states

Authors

Abstract

We establish a direct connection between general tensor networks and deep feed-forward artificial neural networks. The core of our results is the construction of neural-network layers that efficiently perform tensor contractions and that use commonly adopted nonlinear activation functions. The resulting deep networks feature a number of edges that closely matches the contraction complexity of the tensor networks to be approximated. In the context of many-body quantum states, this result establishes that neural-network states have strictly the same or higher expressive power than practically usable variational tensor networks. As an example, we show that all matrix product states can be written as neural-network states with a number of edges polynomial in the bond dimension and depth logarithmic in the system size. The opposite instead does not hold true, and our results imply that there exist quantum states that are not efficiently expressible in terms of matrix product states or practically usable projected entangled pair states, but that can instead be expressed with neural-network states.
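The result above concerns neural-network layers that reproduce tensor contractions. As a point of reference, the sketch below shows the kind of matrix product state contraction such layers would need to emulate; it is illustrative only and is not the paper's construction. The tensor shapes, bond dimension, and the helper name mps_amplitude are assumptions made for this example.

```python
# Minimal sketch, assuming an open-boundary MPS with site tensors of shape
# (D_left, d, D_right); boundary tensors have bond dimension 1.
import numpy as np

def mps_amplitude(tensors, configuration):
    """Contract an MPS along one spin configuration by a left-to-right sweep."""
    left = np.ones((1,))                    # trivial left boundary vector
    for A, s in zip(tensors, configuration):
        left = left @ A[:, s, :]            # (D_left,) @ (D_left, D_right) -> (D_right,)
    return left.item()                      # right boundary dimension is 1

# Example: random MPS with bond dimension 4 on 6 spin-1/2 sites.
rng = np.random.default_rng(0)
D, d, n = 4, 2, 6
shapes = [(1, d, D)] + [(D, d, D)] * (n - 2) + [(D, d, 1)]
mps = [rng.normal(size=s) for s in shapes]
print(mps_amplitude(mps, [0, 1, 0, 1, 1, 0]))
```

Each step of the sweep is a single matrix-vector product, so the cost scales roughly as O(n D^2) for n sites and bond dimension D; this is the sort of contraction complexity that the edge counts quoted in the abstract are matched against.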

Related articles

On the Expressive Power of Deep Neural Networks

We propose a new approach to the problem of neural network expressivity, which seeks to characterize how structural properties of a neural network family affect the functions it is able to compute. Our approach is based on an interrelated set of measures of expressivity, unified by the novel notion of trajectory length, which measures how the output of a network changes as the input sweeps alon...

On the Expressive Power of Deep Learning: A Tensor Analysis

It has long been conjectured that hypothesis spaces suitable for data that is compositional in nature, such as text or images, may be more efficiently represented with deep hierarchical architectures than with shallow ones. Despite the vast empirical evidence, formal arguments to date are limited and do not capture the kind of networks used in practice. Using tensor factorization, we derive a u...

Quantum-chemical insights from deep tensor neural networks

Learning from data has led to paradigm shifts in a multitude of disciplines, including web, text and image search, speech recognition, as well as bioinformatics. Can machine learning enable similar breakthroughs in understanding quantum many-body systems? Here we develop an efficient deep learning approach that enables spatially and chemically resolved insights into quantum-mechanical observabl...

Expressive power of recurrent neural networks

Deep neural networks are surprisingly efficient at solving practical tasks, but the theory behind this phenomenon is only starting to catch up with the practice. Numerous works show that depth is the key to this efficiency. A certain class of deep convolutional networks – namely those that correspond to the Hierarchical Tucker (HT) tensor decomposition – has been proven to have exponentially hi...

Deep Neural Network Approximation using Tensor Sketching

Deep neural networks are powerful learning models that achieve state-of-the-art performance on many computer vision, speech, and language processing tasks. In this paper, we study a fundamental question that arises when designing deep network architectures: Given a target network architecture can we design a “smaller” network architecture that “approximates” the operation of the target network?...

Journal

Journal title: Physical Review

Year: 2022

ISSN: 0556-2813, 1538-4497, 1089-490X

DOI: https://doi.org/10.1103/physrevb.106.205136